Search results for "Bayesian [statistics]"
Showing 10 of 228 documents
Learning Bayesian Metanetworks from Data with Multilevel Uncertainty
2006
Managing knowledge by maintaining it according to dynamic context is among the basic abilities of a knowledge-based system. The two main challenges in managing context in Bayesian networks are the introduction of contextual (in)dependence and Bayesian multinets. We present one possible implementation of a context-sensitive Bayesian multinet, the Bayesian Metanetwork, in which the interoperability between component Bayesian networks (valid in different contexts) can also be modelled by another Bayesian network. The general concepts and two kinds of such Metanetwork models are considered. The main focus of this paper is the learning procedure for Bayesian Metanetworks.
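For intuition, here is a minimal sketch (not the authors' implementation) of the metanetwork idea: a context-level network supplies a distribution over contexts, and each context selects its own component network. All variable names and probabilities below are illustrative assumptions.

```python
# Minimal sketch (illustrative only): a "metanetwork" layer that mixes two
# context-specific component networks P(X | Y, context).

# Context-level network: prior over which component network is valid.
p_context = {"c1": 0.7, "c2": 0.3}

# Component networks: toy CPTs for P(X | Y), one per context.
p_x_given_y = {
    "c1": {("x1", "y1"): 0.9, ("x0", "y1"): 0.1},
    "c2": {("x1", "y1"): 0.2, ("x0", "y1"): 0.8},
}

def p_x(x, y):
    """Marginalize the context variable to get P(X=x | Y=y)."""
    return sum(p_context[c] * p_x_given_y[c][(x, y)] for c in p_context)

print(p_x("x1", "y1"))  # 0.7*0.9 + 0.3*0.2 = 0.69
```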
Japan's FDI drivers in a time of financial uncertainty. New evidence based on Bayesian Model Averaging
2021
In this article we analyze the determinants of Japan's outward FDI stock for the period 1996–2017. This period is especially relevant since it covers a process of growing economic globalization and two financial crises. To that end, we consider a broad set of candidate variables grounded in theory as well as in previous empirical analyses. Our sample includes a total of 27 host countries. We select the covariates using a data-driven methodology, Bayesian Model Averaging (BMA). In addition, we also analyze whether these determinants change with the degree of development (emerging vs. developed) or across geographical areas (EU vs. East Asia…
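For readers unfamiliar with BMA, the hedged sketch below shows the common BIC-approximated form of model averaging over linear regressions. The data and covariate names are synthetic stand-ins, not the paper's dataset.

```python
# Hedged sketch of Bayesian Model Averaging over linear-regression models,
# using the common BIC approximation to posterior model weights.
import itertools
import numpy as np

rng = np.random.default_rng(0)
n, names = 200, ["gdp", "distance", "openness"]   # illustrative covariates
X = rng.normal(size=(n, len(names)))
y = 1.0 + 0.8 * X[:, 0] - 0.5 * X[:, 1] + rng.normal(size=n)

def bic(y, Xsub):
    """BIC of a Gaussian linear model with intercept."""
    Z = np.column_stack([np.ones(len(y)), Xsub])
    beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
    resid = y - Z @ beta
    sigma2 = resid @ resid / len(y)
    loglik = -0.5 * len(y) * (np.log(2 * np.pi * sigma2) + 1)
    return Z.shape[1] * np.log(len(y)) - 2 * loglik

# Enumerate all covariate subsets, weight each model by exp(-BIC/2).
models = [s for k in range(len(names) + 1)
          for s in itertools.combinations(range(len(names)), k)]
bics = np.array([bic(y, X[:, list(s)]) for s in models])
w = np.exp(-0.5 * (bics - bics.min()))
w /= w.sum()

# Posterior inclusion probability of each candidate driver.
pip = {names[j]: sum(wi for wi, s in zip(w, models) if j in s)
       for j in range(len(names))}
print(pip)
```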
A Generalized Missing-Indicator Approach to Regression with Imputed Covariates
2011
We consider estimation of a linear regression model using data where some covariate values are missing but imputations are available to fill in the missing values. This situation generates a tradeoff between bias and precision when estimating the regression parameters of interest. Using only the subsample of complete observations does not cause bias but may imply a substantial loss of precision, because the complete cases may be too few. On the other hand, filling in the missing values with imputations may cause bias. We provide the new Stata command gmi, which handles this tradeoff by using either model reduction or Bayesian model averaging techniques in the context of the generalized miss…
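A hedged illustration of the missing-indicator idea itself (plain Python, not the gmi Stata command or its syntax): impute the missing covariate, add a dummy flagging the imputed observations, and fit the regression on the augmented design.

```python
# Illustrative sketch of the missing-indicator approach with imputed covariates.
import numpy as np

rng = np.random.default_rng(1)
n = 500
x = rng.normal(size=n)
y = 2.0 + 1.5 * x + rng.normal(size=n)

miss = rng.random(n) < 0.3                          # 30% of x values missing
x_obs = np.where(miss, np.nan, x)
x_imp = np.where(miss, np.nanmean(x_obs), x_obs)    # crude mean imputation

# Design matrix: intercept, imputed covariate, missingness indicator.
Z = np.column_stack([np.ones(n), x_imp, miss.astype(float)])
beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
print(beta)   # slope on x_imp trades some bias for extra precision
```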
Use of hierarchical Bayesian framework in MTS studies to model different causes and novel possible forms of acquired MTS
2015
An integrative account of MTS could be cast in terms of hierarchical Bayesian inference. It may help to highlight the central role that sensory (tactile) precision could play in MTS. We suggest that anosognosic patients, with anesthetic hemisoma, can also be interpreted as a form of acquired MTS, providing additional data for the model.
Deep Importance Sampling based on Regression for Model Inversion and Emulation
2021
Understanding systems by forward and inverse modeling is a recurrent topic of research in many domains of science and engineering. In this context, Monte Carlo methods have been widely used as powerful tools for numerical inference and optimization. They require the choice of a suitable proposal density that is crucial for their performance. For this reason, several adaptive importance sampling (AIS) schemes have been proposed in the literature. We here present an AIS framework called Regression-based Adaptive Deep Importance Sampling (RADIS). In RADIS, the key idea is the adaptive construction via regression of a non-parametric proposal density (i.e., an emulator), which mimics the posteri…
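As rough orientation, the skeleton below is a generic adaptive importance sampler; RADIS additionally builds its proposal from a regression-based emulator, which is not reproduced here. The toy target and tuning constants are assumptions.

```python
# Generic adaptive importance sampling skeleton (a simplified stand-in for
# schemes like RADIS, which fit a regression-based emulator as the proposal).
import numpy as np

rng = np.random.default_rng(2)

def log_target(x):
    """Unnormalized log posterior: a 1-D two-mode Gaussian mixture as a toy target."""
    return np.logaddexp(-0.5 * (x - 2.0) ** 2, -0.5 * (x + 2.0) ** 2)

mu, sigma = 0.0, 5.0                      # initial proposal N(mu, sigma^2)
for it in range(20):
    x = rng.normal(mu, sigma, size=1000)  # draw from current proposal
    log_q = -0.5 * ((x - mu) / sigma) ** 2 - np.log(sigma)
    log_w = log_target(x) - log_q         # importance weights (unnormalized)
    w = np.exp(log_w - log_w.max())
    w /= w.sum()
    mu = np.sum(w * x)                    # adapt proposal moments
    sigma = np.sqrt(np.sum(w * (x - mu) ** 2)) + 1e-3

print(mu, sigma)  # proposal now roughly matches the target's spread
```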
Conditional measures and their applications to fuzzy sets
1991
Given a ⊥-decomposable measure with respect to a continuous t-conorm, as introduced by the author in an earlier paper (see Section 1), we can construct ⊥-conditional measures as implications. These fulfil a ‘generalized product law’, replacing the product in the classical law by any other strict t-norm, and turn out to be decomposable with respect to an operation ⊥_V depending on ⊥, the t-norm, and the condition set V (Section 2). More generally, conditional measures are introduced axiomatically and are shown to be ⊥-conditional measures with respect to some ⊥-decomposable measure (Section 3). ‘Bayesian-like’ models are given which are alternatives to that presented by the author in a recent p…
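One plausible reading of the ‘generalized product law’ mentioned above, with notation assumed rather than taken from the paper: the classical identity P(A ∩ V) = P(A | V) · P(V), with the product replaced by a strict t-norm T, for a ⊥-decomposable measure m.

```latex
% Assumed form of the generalized product law for a \perp-decomposable measure m,
% with a strict t-norm T replacing the ordinary product:
m(A \cap V) \;=\; T\bigl(m(A \mid V),\, m(V)\bigr)
```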
Model comparison and selection for stationary space–time models
2007
An intensive simulation study comparing the spatio-temporal prediction performances of various space-time models is presented. Models with separable spatio-temporal covariance functions and nonseparable ones are considered under various scenarios. The computational performance of the various selected models is also compared. The issue of how to select an appropriate space-time model by accounting for the tradeoff between goodness-of-fit and model complexity is addressed. The performances of the two commonly used model-selection criteria, the Akaike information criterion and the Bayesian information criterion, are examined. Furthermore, a practical application based on the statistical ana…
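For reference, the two criteria compared in the study, for a model with k parameters, maximized likelihood L-hat, and n observations:

```latex
% Akaike and Bayesian information criteria (smaller is better):
\mathrm{AIC} = 2k - 2\ln\hat{L}, \qquad \mathrm{BIC} = k\ln n - 2\ln\hat{L}
```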
Population Properties of Compact Objects from the Second LIGO-Virgo Gravitational-Wave Transient Catalog
2021
Abbott, R., et al. (LIGO and Virgo Collaboration)
Particle Group Metropolis Methods for Tracking the Leaf Area Index
2020
Monte Carlo (MC) algorithms are widely used for Bayesian inference in statistics, signal processing, and machine learning. In this work, we introduce a Markov chain Monte Carlo (MCMC) technique driven by a particle filter. The resulting scheme is a generalization of the so-called Particle Metropolis-Hastings (PMH) method, where a suitable Markov chain of sets of weighted samples is generated. We also introduce a marginal version for jointly inferring dynamic and static variables. The proposed algorithms outperform the corresponding standard PMH schemes, as shown by numerical experiments.
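A hedged sketch of the standard PMH recipe the abstract builds on (not the paper's group or marginal variants): a bootstrap particle filter supplies a likelihood estimate for a toy state-space model, and that estimate drives a Metropolis-Hastings chain over the static parameter. The model, prior, and tuning constants are assumptions.

```python
# Standard Particle Metropolis-Hastings sketch for a toy linear-Gaussian
# state-space model: x_t = a*x_{t-1} + noise, y_t = x_t + noise.
import numpy as np

rng = np.random.default_rng(3)

# Simulate data with true parameter a = 0.8.
T, a_true = 100, 0.8
x = np.zeros(T)
for t in range(1, T):
    x[t] = a_true * x[t - 1] + rng.normal()
y = x + rng.normal(size=T)

def pf_loglik(a, y, n_part=200):
    """Bootstrap particle filter estimate of log p(y | a)."""
    particles = rng.normal(size=n_part)
    loglik = 0.0
    for t in range(len(y)):
        particles = a * particles + rng.normal(size=n_part)      # propagate
        logw = -0.5 * (y[t] - particles) ** 2                    # weight
        m = logw.max()
        w = np.exp(logw - m)
        loglik += m + np.log(w.mean()) - 0.5 * np.log(2 * np.pi)
        idx = rng.choice(n_part, size=n_part, p=w / w.sum())     # resample
        particles = particles[idx]
    return loglik

# Metropolis-Hastings on the static parameter a, driven by the filter.
a, ll = 0.5, pf_loglik(0.5, y)
chain = []
for it in range(500):
    a_prop = a + 0.1 * rng.normal()
    ll_prop = pf_loglik(a_prop, y)
    if np.log(rng.random()) < ll_prop - ll:   # flat prior on a assumed
        a, ll = a_prop, ll_prop
    chain.append(a)
print(np.mean(chain[250:]))   # should settle near the true value 0.8
```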
Prior-based Bayesian information criterion
2019
We present a new approach to model selection and Bayes factor determination, based on Laplace expansions (as in BIC), which we call the Prior-based Bayes Information Criterion (PBIC). In this approach, the Laplace expansion is done only with the likelihood function, and a suitable prior distribution is then chosen to allow exact computation of the (approximate) marginal likelihood arising from the Laplace approximation and the prior. The result is a closed-form expression similar to BIC, but one that involves a term arising from the prior distribution (which BIC ignores) and also incorporates the idea that different parameters can have different effective sample sizes (whereas BIC only allows one ov…
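For context, the standard Laplace expansion of the marginal likelihood that BIC truncates, under the usual regularity conditions; per the abstract, PBIC retains a prior-dependent term of this kind rather than discarding it. Notation: k parameters, n observations, MLE theta-hat, observed information matrix I-hat of the log-likelihood.

```latex
% Laplace expansion of the log marginal likelihood; BIC keeps only the first
% two terms, dropping the prior-dependent and O(1) contributions:
\ln m(y) \;=\; \ln L(\hat\theta) \;-\; \frac{k}{2}\ln n
\;+\; \ln \pi(\hat\theta) \;+\; \frac{k}{2}\ln(2\pi)
\;-\; \frac{1}{2}\ln\Bigl|\tfrac{\hat{I}}{n}\Bigr| \;+\; o(1)
```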